Sparsity and Error Analysis of Empirical Feature-Based Regularization Schemes

Authors

  • Xin Guo
  • Jun Fan
  • Ding-Xuan Zhou
Abstract

We consider a learning algorithm generated by a regularization scheme with a concave regularizer for the purpose of achieving sparsity and good learning rates in a least squares regression setting. The regularization is induced for linear combinations of empirical features, constructed in the literature on kernel principal component analysis and kernel projection machines, based on kernels and samples. In addition to the separability of the involved optimization problem caused by the empirical features, we carry out sparsity and error analysis, giving bounds in the norm of the reproducing kernel Hilbert space, based on a priori conditions which do not require assumptions on sparsity in terms of any basis or system. In particular, we show that as the concave exponent q of the concave regularizer increases to 1, the learning ability of the algorithm improves. Some numerical simulations for both artificial and real MHC-peptide binding data involving the ℓq regularizer and the SCAD penalty are presented to demonstrate the sparsity and error analysis.
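The separability the abstract mentions can be illustrated with a minimal sketch, not the authors' implementation: empirical features are taken as eigenvectors of the kernel matrix (kernel-PCA style), in which orthonormal system the regularized least squares objective splits into one-dimensional problems. The Gaussian kernel, the choice q = 0.5, the grid-search 1-D solver, and all names below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gram matrix of a Gaussian kernel (an illustrative kernel choice)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def solve_1d(b, lam, q):
    """Minimize (c - b)^2 + lam * |c|^q by grid search over [0, |b|].

    The concave penalty keeps the problem nonconvex, so this simple
    grid search stands in for a proper 1-D solver.
    """
    grid = np.linspace(0.0, abs(b), 201)
    obj = (grid - abs(b)) ** 2 + lam * grid ** q
    return np.sign(b) * grid[np.argmin(obj)]

def fit(X, y, lam=0.1, q=0.5):
    """Return (coefficients, fitted values) for the empirical-feature model."""
    K = gaussian_kernel(X, X)
    # Empirical features: orthonormal eigenvectors of the kernel matrix.
    _, vecs = np.linalg.eigh(K)
    # With orthonormal columns V, ||y - V c||^2 + lam * sum_k |c_k|^q
    # separates coordinate-wise with targets b = V^T y.
    b = vecs.T @ y
    c = np.array([solve_1d(bk, lam, q) for bk in b])
    return c, vecs @ c
```

Because each coordinate is thresholded independently, larger lam (or smaller q) drives more coefficients exactly to zero, which is the sparsity effect the paper analyzes.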

Similar resources

Sparsity Based Regularization

In previous lectures, we saw how regularization can be used to restore the well-posedness of the empirical risk minimization (ERM) problem. We also derived algorithms that use regularization to impose smoothness assumptions on the solution space (as in the case of Tikhonov regularization) or introduce additional structure by confining the solution space to low dimensional manifolds (manifold re...
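The Tikhonov regularization this snippet refers to can be sketched in a few lines; this is a generic ridge-regression illustration, not code from the cited lecture notes. When n < d the normal equations are singular and ERM is ill-posed, but adding lam * I restores a unique solution.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form minimizer of ||X w - y||^2 + lam * ||w||^2.

    The lam * I term makes X^T X + lam * I invertible even when
    X^T X is singular, restoring well-posedness of ERM.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
```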


Direct Sparsity Optimization Based Feature Selection for Multi-Class Classification

A novel sparsity optimization method is proposed to select features for multi-class classification problems by directly optimizing an ℓ2,p-norm (0 < p ≤ 1) based sparsity function subject to data-fitting inequality constraints to obtain large between-class margins. The direct sparse optimization method circumvents the empirical tuning of regularization parameters in existing feature selection...


Efficient Learning of Sparse Conditional Random Fields for Supervised Sequence Labelling

Conditional Random Fields (CRFs) constitute a popular and efficient approach for supervised sequence labelling. CRFs can cope with large description spaces and can integrate some form of structural dependency between labels. In this contribution, we address the issue of efficient feature selection for CRFs based on imposing sparsity through an L1 penalty. We first show how sparsity of the param...


3D Inversion of Magnetic Data through Wavelet based Regularization Method

This study deals with 3D recovery of the magnetic susceptibility model by incorporating sparsity-based constraints in the inversion algorithm. For this purpose, the area under prospect was divided into a large number of rectangular prisms in a mesh with unknown susceptibilities. Tikhonov cost functions with two sparsity functions were used to recover the smooth parts as well as the sharp ...


ℓ1 Regularization in Infinite Dimensional Feature Spaces

In this paper we discuss the problem of fitting ℓ1 regularized prediction models in infinite (possibly non-countable) dimensional feature spaces. Our main contributions are: a. Deriving a generalization of ℓ1 regularization based on measures which can be applied in non-countable feature spaces; b. Proving that the sparsity property of ℓ1 regularization is maintained in infinite dimensions; c. D...



Journal:
  • Journal of Machine Learning Research

Volume 17, Issue

Pages -

Publication date: 2016